Deterministic Sparse Column Based Matrix Reconstruction via Greedy Approximation of SVD
Authors
Abstract
Given a matrix A ∈ R^{m×n} of rank r and an integer k < r, the top k singular vectors provide the best rank-k approximation A_k to A. When the columns of A have specific meaning, it is desirable to find (provably) "good" approximations to A_k which use only a small number of columns of A. Proposed solutions to this problem have thus far focused on randomized algorithms. Our main result is a simple greedy deterministic algorithm with guarantees on both the performance and the number of columns chosen. Specifically, our greedy algorithm chooses c columns from A with c = O((k² log k / ε²) · μ(A) · ln(√k ‖A_k‖_F / (ε ‖A − A_k‖_F))) such that ‖A − C_gr C_gr⁺ A‖_F ≤ (1 + ε) ‖A − A_k‖_F, where C_gr is the matrix composed of the c chosen columns, C_gr⁺ is the pseudoinverse of C_gr (so C_gr C_gr⁺ A is the best reconstruction of A from C_gr), and μ(A) is a measure of the coherence of the normalized columns of A. The running time of the algorithm is O(SVD(A_k) + mnc), where SVD(A_k) is the time required to compute the first k singular vectors of A. To the best of our knowledge, this is the first deterministic algorithm with performance guarantees on the number of columns and a (1 + ε) approximation ratio in Frobenius norm. The algorithm is simple and intuitive, and is obtained by combining a generalization of the well-known sparse approximation problem from information theory with an existence result on the possibility of sparse approximation. Tightening the analysis along either of these two dimensions would yield improved results.
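To make the quantities in the guarantee concrete, the following minimal NumPy sketch compares the column-based reconstruction error ‖A − C C⁺ A‖_F with the best rank-k error ‖A − A_k‖_F. The greedy rule used here (repeatedly taking the column with the largest residual norm) is a generic placeholder, not the paper's selection procedure, and the helpers `rank_k_approx` and `greedy_columns` are illustrative names of our own.

```python
# Minimal NumPy sketch (not the paper's algorithm): compare the column-based
# reconstruction error ||A - C C^+ A||_F with the best rank-k error ||A - A_k||_F.
import numpy as np

def rank_k_approx(A, k):
    """Best rank-k approximation A_k via the truncated SVD."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

def greedy_columns(A, c):
    """Generic greedy placeholder: repeatedly pick the column whose residual
    (after projecting A onto the columns chosen so far) has the largest norm."""
    chosen, R = [], A.copy()
    for _ in range(c):
        chosen.append(int(np.argmax(np.linalg.norm(R, axis=0))))
        C = A[:, chosen]
        R = A - C @ np.linalg.pinv(C) @ A   # residual of the best reconstruction from C
    return chosen

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 30)) @ rng.standard_normal((30, 40))  # rank at most 30
k, c = 5, 12
A_k = rank_k_approx(A, k)
C = A[:, greedy_columns(A, c)]
err_cols = np.linalg.norm(A - C @ np.linalg.pinv(C) @ A, "fro")  # ||A - C C^+ A||_F
err_best = np.linalg.norm(A - A_k, "fro")                        # ||A - A_k||_F
print(err_cols / err_best)  # this ratio plays the role of (1 + ε) in the guarantee
```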
Similar Papers
Column subset selection via sparse approximation of SVD
Given a real matrix A ∈ R^{m×n} of rank r and an integer k < r, the sum of the outer products of the top k singular vectors, scaled by the corresponding singular values, provides the best rank-k approximation A_k to A. When the columns of A have specific meaning, it might be desirable to find good approximations to A_k which use a small number of columns of A. This paper provides a simple greedy algorithm...
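The first sentence above is the standard Eckart–Young fact that A_k is the sum of the top-k scaled outer products σ_i u_i v_iᵀ. A small NumPy check of that identity, on arbitrary test data, might look like:

```python
# Small check (illustrative only): A_k written as a sum of the top-k scaled
# outer products equals the usual truncated-SVD product U_k diag(s_k) V_kᵀ.
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 6))
k = 2

U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_k = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(k))

assert np.allclose(A_k, U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :])
print(np.linalg.norm(A - A_k, "fro"))  # error of the best rank-k approximation
```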
Speech Enhancement using Adaptive Data-Based Dictionary Learning
In this paper, a speech enhancement method based on sparse representation of data frames is presented. Speech enhancement is one of the most widely applied areas of signal processing. The objective of a speech enhancement system is to improve either the intelligibility or the quality of speech signals. This process is carried out using speech signal processing techniques ...
Randomized Iterative Hard Thresholding: A Fast Approximate MMSE Estimator for Sparse Approximations
Typical greedy algorithms for sparse reconstruction problems, such as orthogonal matching pursuit and iterative thresholding, seek strictly sparse solutions. Recent work in the literature suggests that given a priori knowledge of the distribution of the sparse signal coefficients, better results can be obtained by a weighted averaging of several sparse solutions. Such a combination of solutions...
Convolutional Matching Pursuit and Dictionary Training
Here, {W, Z} are the dictionary and the coefficients, respectively, and z_k is the k-th column of Z. K, q, and λ are user-selected parameters controlling the power of the model. More recently, many models with additional structure have been proposed. For example, in [9, 2], the dictionary elements are arranged in groups and the sparsity is imposed at the group level. In [3, 5, 7], the dictionaries are con...
ECG Signal Reconstruction from Undersampled Measurement Using A Trained Overcomplete Dictionary
We propose a new approach to reconstructing ECG signals from undersampled data based on constructing a combined overcomplete dictionary. The dictionary is obtained by combining a dictionary trained with the K-SVD dictionary learning algorithm and a universal dictionary such as a DCT or wavelet basis. Using the trained overcomplete dictionary, the proposed method can find sparse approximation b...
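To illustrate what sparse approximation over a combined overcomplete dictionary means in practice, here is a generic orthogonal matching pursuit sketch over DCT atoms stacked with random stand-ins for learned atoms. The shapes, the `omp` helper, and the random "learned" block are illustrative assumptions, not the paper's K-SVD pipeline.

```python
# Generic sketch: sparse approximation over a combined (DCT + "learned") dictionary
# using orthogonal matching pursuit. The random block stands in for K-SVD atoms.
import numpy as np

def omp(D, x, n_nonzero):
    """Greedy OMP: pick the atom most correlated with the residual, then
    refit all selected coefficients by least squares."""
    support, coef = [], np.zeros(D.shape[1])
    residual = x.copy()
    for _ in range(n_nonzero):
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        sol, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        coef[:] = 0.0
        coef[support] = sol
        residual = x - D[:, support] @ sol
    return coef

n = 64
# Orthonormal DCT-II basis as the "universal" part of the dictionary.
i, j = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
dct = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * i + 1) * j / (2 * n))
dct[:, 0] /= np.sqrt(2.0)

rng = np.random.default_rng(2)
learned = rng.standard_normal((n, 32))
learned /= np.linalg.norm(learned, axis=0)       # unit-norm "learned" atoms
D = np.hstack([dct, learned])                    # combined overcomplete dictionary

x = 2.0 * dct[:, 3] + 0.7 * learned[:, 5]        # a signal that is sparse in D
coef = omp(D, x, n_nonzero=4)
print(np.linalg.norm(x - D @ coef))              # small reconstruction error
```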